Distributed Agents for Web Content Filtering
Authors
Abstract
Similar resources
User Model Agents for Distributed Filtering
This paper describes an architecture for intelligent agents derived from user models. The work extends our previous work on user modelling as a basis for various forms of customised systems. We make use of the um toolkit for representation, management and acquisition of the user models. We have used um-models as the basis for customisation of a coaching system, a personalised hypertext for tea...
Disseminating Mobile Agents for Distributed Information Filtering
An often claimed benefit of mobile agent technology is the reduction of communication cost. Especially the area of information filtering has been proposed for the application of mobile filter agents. However, an effective coordination of agents, which takes into account the current network conditions, is difficult to achieve. This contribution analyses the situation that data distributed among ...
Named Entity Recognition for Web Content Filtering
Effective Web content filtering is a necessity in educational and workplace environments, but current approaches are far from perfect. We discuss a model for text-based intelligent Web content filtering, in which shallow linguistic analysis plays a key role. In order to demonstrate how this model can be realized, we have developed a lexical Named Entity Recognition system, and used it to improv...
Using an Automatic Weighted Keywords Dictionary for Intelligent Web Content Filtering
Filtering of web pages with inappropriate content is one of the major issues in the field of intelligent network security. An intelligent filtering method with high accuracy and speed is needed by any country in order to control users' access to the web, so it has been considered by many researchers. Presenting web pages in a way understandable by machines is one of the most im...
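As a rough illustration of the weighted-keywords idea, a filter can sum the dictionary weights of the keywords found on a page and block it when the total crosses a threshold. The dictionary, weights, and threshold below are invented for this sketch; they are not taken from the paper.

```python
# Hypothetical sketch of weighted-keyword page scoring; weights and the
# threshold are illustrative placeholders, not the paper's learned values.
def keyword_score(text, weights):
    """Sum the dictionary weight of every known keyword in the page text."""
    tokens = text.lower().split()
    return sum(weights.get(tok, 0.0) for tok in tokens)

def should_block(text, weights, threshold=1.0):
    """Block the page when its accumulated keyword weight reaches the threshold."""
    return keyword_score(text, weights) >= threshold

# assumed example weights
weights = {"gambling": 0.6, "casino": 0.5, "research": 0.0}
print(should_block("online casino gambling portal", weights))  # True
print(should_block("academic research portal", weights))       # False
```

In a real system the weights would be learned automatically from labelled pages, which is the point of the paper's "automatic" dictionary.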
Web Adult Content Detection and Filtering System
This paper describes a Web filtering system “WebGuard,” which aims to automatically detect and filter adult content on the Web. WebGuard uses data mining techniques to classify URLs into two classes: suspect URLs and normal URLs. The suspect URLs are stored in a database, which is constantly and automatically updated in order to reflect the highly dynamic evolution of the Web. When working, Web...
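The suspect/normal split described for WebGuard can be sketched as a host lookup against a continuously updated URL database. The in-memory set and helper names below are assumptions for illustration, not WebGuard's actual API or data.

```python
# Minimal sketch of suspect/normal URL classification with an updatable
# blocklist; the set stands in for WebGuard's constantly updated database.
from urllib.parse import urlparse

suspect_hosts = {"bad.example.com"}  # assumed seed entries

def classify_url(url):
    """Return 'suspect' if the URL's host is in the database, else 'normal'."""
    host = urlparse(url).hostname
    return "suspect" if host in suspect_hosts else "normal"

def record_suspect(url):
    """Add a newly detected host, mirroring the automatic database updates."""
    suspect_hosts.add(urlparse(url).hostname)

print(classify_url("http://bad.example.com/page"))  # suspect
record_suspect("http://new-bad.example.org/")
print(classify_url("http://new-bad.example.org/x"))  # suspect
```

The actual system classifies pages with data-mining techniques rather than a fixed list; the lookup here only models how results are stored and reused.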
Journal
Journal title: Iraqi Journal for Computers and Informatics
Year: 2016
ISSN: 2520-4912, 2313-190X
DOI: 10.25195/ijci.v42i1.77